EM Algorithms for Probabilistic Mapping Networks

Authors

  • Haizhou Li
  • Yifan Gong
  • Jean-Paul Haton
Abstract

The Expectation-Maximization (EM) algorithm is a general technique for maximum likelihood estimation (MLE). In this paper we present several of the important theoretical and practical issues associated with Gaussian mixture modeling (GMM) within the EM framework. First, we propose an EM algorithm for estimating the parameters of a special GMM structure, named a probabilistic mapping network (PMN), where the Gaussian probability density function is realized as an internal node. In this way, the EM algorithm is extended to deal with the supervised learning of a multicategory classification problem and serves as a parameter estimator of the neural network Gaussian classifier. Then, a generalized EM (GEM) algorithm is developed as an alternative to the MLE problem of PMN. This is followed by a discussion of computational considerations and algorithmic comparisons. It is shown that GEM converges faster than EM to the same solution space. The computational efficiency and the numerical stability of the training algorithm benefit from the well-established EM framework. The effectiveness of the proposed PMN architecture and the developed EM algorithms is assessed by conducting a set of speaker recognition experiments.

INRIA research report: Apprentissage de Réseaux Probabilistes par Algorithmes EM

Résumé (translated from the French): The "Expectation-Maximization" (EM) algorithm is a general technique for maximum likelihood estimation. This document presents some important theoretical and practical aspects of Gaussian mixture modeling (GMM) within the EM framework. We first propose an EM algorithm for estimating the parameters of a particular GMM model, called a PMN ("probabilistic mapping network"), in which the Gaussian probability density function is computed by an internal node of the network. We then present a generalized EM (GEM) algorithm as an alternative solution to the maximum likelihood estimation of a PMN. This presentation is completed by a discussion comparing the computational aspects of the two algorithms, EM and GEM. In particular, we show that GEM converges faster than EM toward the same solution space. The practical effectiveness of the PMN architecture and the proposed EM algorithms is evaluated on a set of practical examples in the domain of speaker recognition.


Similar articles

Application of Probabilistic Clustering Algorithms to Determine Mineralization Areas in Regional-Scale Exploration Studies

In this work, we aim to identify the mineralization areas for the next exploration phases. Probabilistic clustering algorithms are used to determine the multi-element geochemical anomalies because of their use of appropriate measures, their ability to work with datasets containing missing values, and their resistance to trapping in local optima. Four probabilistic clustering algorithms, namely PHC,...


Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection

In this study, two novel learning algorithms have been applied to the Radial Basis Function Neural Network (RBFNN) to approximate functions of high nonlinear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns the various strategies to optimize the procedure of Gradient ...


Load-Frequency Control: a GA based Bayesian Networks Multi-agent System

Bayesian Networks (BN) provide a robust probabilistic method of reasoning under uncertainty. They have been successfully applied in a variety of real-world tasks, but they have received little attention in the area of load-frequency control (LFC). In practice, LFC systems use proportional-integral controllers. However, since these controllers are designed using a linear model, the nonlinearities...


A Viterbi-like algorithm and EM learning for statistical abduction

We propose statistical abduction as a first-order logical framework for representing and learning probabilistic knowledge. It combines logical abduction with a parameterized distribution over abducibles. We show that probability computation, a Viterbi-like algorithm, and EM learning for statistical abduction achieve the same efficiency as specialized algorithms for HMMs (hidden Markov models), PCFGs (pr...


An Introduction to Inference and Learning in Bayesian Networks

Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in different subjects such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical-probabilistic model which represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...




Journal title:

Volume   Issue

Pages  -

Publication date: 1995